88 research outputs found

    Creating Interaction Scenarios With a New Graphical User Interface

    Full text link
    The field of human-centered computing has known a major progress these past few years. It is admitted that this field is multidisciplinary and that the human is the core of the system. It shows two matters of concern: multidisciplinary and human. The first one reveals that each discipline plays an important role in the global research and that the collaboration between everyone is needed. The second one explains that a growing number of researches aims at making the human commitment degree increase by giving him/her a decisive role in the human-machine interaction. This paper focuses on these both concerns and presents MICE (Machines Interaction Control in their Environment) which is a system where the human is the one who makes the decisions to manage the interaction with the machines. In an ambient context, the human can decide of objects actions by creating interaction scenarios with a new visual programming language: scenL.Comment: 5th International Workshop on Intelligent Interfaces for Human-Computer Interaction, Palerme : Italy (2012

    Study of the Importance of Adequacy to Robot Verbal and Non Verbal Communication in Human-Robot interaction

    Full text link
    The Robadom project aims at creating a homecare robot that helps and assists people in their daily life, either by doing tasks for the human or by managing the day's organization. A robot can take on this kind of role only if it is accepted by humans. Before considering the robot's appearance, we decided to evaluate the importance of the relation between verbal and nonverbal communication during a human-robot interaction, in order to determine the situations in which the robot is accepted. We conducted two experiments to study this acceptance. The first experiment studied the importance of the robot's nonverbal behavior matching its verbal behavior. The second experiment studied the capability of a robot to sustain a correct human-robot interaction. Comment: the 43rd Symposium on Robotics - ISR 2012, Taipei, Taiwan (2012)

    Artificial Companion: building an impacting relation

    No full text
    In this paper we show that we are facing an evolution from traditional human-computer interaction toward a kind of intense exchange between the human user and a new generation of virtual or real systems - Embodied Conversational Agents (ECAs) or affective robots - bringing the interaction to another level, the "relation level". We call these systems "companions", that is to say systems with which the user wants to build a kind of life-long relationship. We thus argue that we need to go beyond the concepts of acceptability and believability of systems, get closer to the human, and look for a concept of "impact". We will see that this problem is shared between the research communities of Embodied Conversational Agents (ECAs) and affective robotics. We put forward a definition of an "impacting relation" that will enable believable interactive ECAs or robots to become believable impacting companions

    Real-Time Detection of the Activity of a Dog

    Get PDF
    This paper introduces our preliminary work with assistance dogs. Even when dogs are very well trained, some problems may occur in practice; typical examples are the dog escaping or running after a cat. Our long-term objective is to take advantage of technology to increase the safety of the dog and its owner. Our first work focuses on the activity classification of the dog. This paper presents preliminary results for recognizing four types of activity: walking, running, lying down and sitting down. Experiments and results on real data collected with low-cost gyroscopes and accelerometers are presented and discussed

    Increasing Communication Between a Human and a Dog

    No full text
    In this paper, we present the first results of our ongoing work on a robotic system embedded on a dog to enrich communication. Two problems are addressed here: how to keep control of a dog when the human cannot see it, and, for a dog trained to perform specific activities in particular situations, how to detect those activities. We present results on controlling the dog through an embedded voice, and on real-time recognition of some of the dog's activities: walking, sitting, running and lying down

    The Richest Possible Interaction at the Lowest Cost

    No full text
    Cognitive stimulation games are of major importance in slowing the decline of people with cognitive disorders. Some of these games belong to the GUI domain and show limitations, especially since the emergence of NUIs (user passivity, less rich interaction, etc.). Games also exist in the NUI domain, but they often use expensive technology or are specialized for one specific problem, and often the proposed exercises cannot be modified. This article proposes a low-cost solution for developing stimulation games in the NUI domain. The principle is to use the digital devices already present to build a game that is attractive and reusable in other domains. This article presents StimCards, an interactive card game. Users can create their own cards and thus have an unlimited base of questions, so the game adapts to any domain and any application. An experiment showed that StimCards is stimulating and accepted by users

    MASL: a Language for Multi-Agent Systems

    Get PDF
    The classical approach to Multi-Agent System (MAS) control, especially for autonomous and robotic systems, starts from a microscopic point of view: each agent embeds a control program with communication/synchronization primitives that enable cooperation between agents. The emergence of a global behaviour, seen from a macroscopic point of view, can only be observed afterwards. In this context, MASL offers a macroscopic and unified approach to heterogeneous and distributed computation over deliberative, reactive or hybrid agents. In this high-level language, regardless of the runtime, each concurrent agent locally decides its participation in a collective execution block named an e-block. Each e-block is an anonymous collective program that runs over an agent network according to local conditions. The orchestration mode (scalar, asynchronous, synchronous) is statically fixed by a shared block attribute. Communication uses shared memory, events, synchronous message passing, and asynchronous message passing. Heterogeneous agents are managed through inheritance and polymorphism. A permeability mechanism, dealing with agent autonomy, allows an agent to dynamically filter calls to its interface according to the sender's position in the e-block hierarchy. For dynamic task allocation among agents, automatic failover and recovery, or agent replacement in a robot fleet (in case of agent failure, or loss of a functionality mandatory for the mission), an e-block is an entry point for collaborative work. In the case of a synchronous e-block, the programming paradigm is the data-parallel model, with iterative tasks for waves of agents. Finally, MASL offers advances in the field of MAS (dynamic group membership, precise pacing of the actions to undertake to enable a desired cooperation) and in error management
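    The abstract gives no MASL syntax, so the following is only a hypothetical Python illustration of the e-block idea it describes: each agent locally decides whether to participate in a collective block, and the anonymous block body then runs over exactly the participating agents. All names (`Agent`, `run_eblock`, the battery condition) are invented for this sketch.

```python
class Agent:
    """A fleet member with purely local state."""
    def __init__(self, name, battery):
        self.name = name
        self.battery = battery

    def joins(self, condition):
        # Local decision: participation depends only on this agent's own state,
        # mirroring how a MASL agent locally decides to enter an e-block.
        return condition(self)

def run_eblock(agents, condition, body):
    # An "e-block" sketch: an anonymous collective program executed over
    # every agent that locally chose to participate (one synchronous wave).
    participants = [a for a in agents if a.joins(condition)]
    return {a.name: body(a) for a in participants}

fleet = [Agent("r1", 80), Agent("r2", 15), Agent("r3", 60)]
result = run_eblock(
    fleet,
    condition=lambda a: a.battery > 30,        # join only if charged enough
    body=lambda a: f"{a.name} surveys zone",   # collective task body
)
# r2 stays out of the block, so a low-battery robot is replaced implicitly:
# the surviving participants carry the collaborative work.
```

This also hints at the failover scenario the abstract mentions: an agent that loses a mandatory capability simply stops satisfying the participation condition and drops out of subsequent e-blocks without any central coordinator.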

    iGrace – Emotional Computational Model for EmI Companion Robot.

    Get PDF
    Chapter 4. In this chapter we will discuss research in the field of emotional interaction, aimed at maintaining a non-verbal interaction with children from 4 to 8 years old. This work fits into the EmotiRob project, whose goal is to comfort vulnerable and/or hospitalized children with an emotional companion robot. Since the use of robots in hospitals is still limited, we decided to put forward a simple robot architecture, and therefore simple emotional expression: in this context, a robot that is too complex or too voluminous must be avoided. After a study of advanced research on emotional perception and synthesis, it was important to determine the most appropriate way to express emotions in order to obtain a recognition rate acceptable for our target audience. Following an experiment on this subject, we were able to determine the degrees of freedom the robot needs to express the six primary emotions. The second step was the definition and description of our emotional model. In order to obtain a wide range of expressions while respecting the number of degrees of freedom, we use the concept of emotional experiences. These provide almost two hundred different behaviors for the model; however, we decided as a first step to limit ourselves to only fifty behaviors. This diversification is made possible by a mix of emotions linked to the dynamics of emotions. With this theoretical model established, we started various experiments with a variety of audiences in order to validate its relevance and the emotion recognition rate. The first experiment was performed using a simulator for speech capture and for the emotional and behavioral synthesis of the robot. It validated the model assumptions to be integrated into EmI - the Emotional Model of Interaction. Future phases of the project will evaluate the robot, both in its expressiveness and in the comfort it provides to children. We describe the protocols used and present the results for EmI. These experiments will allow us to adjust and adapt the model. We finish this chapter with a brief description of the robot's architecture and the improvements to be made for the second version of EmI

    StimCards: interactive and configurable Question and Answer game - Users study conclusion

    No full text
    This paper highlights conclusions from six experiments conducted with StimCards, an interactive and configurable question-and-answer game. It was created in the context of the Robadom project, whose goal is to propose a homecare robot for seniors. In this project, StimCards is applied to cognitive stimulation. The game is distinctive because users can create their own questions and their own game scripts, and decide which digital devices will be used to interact with it. Two experiments evaluated the possibility for users to create game scripts. Two other experiments compared children and seniors, evaluating StimCards' acceptability and the users' preferred computing interlocutor. Results showed that creating game scripts is easy enough that children can do it. Both children and seniors liked StimCards, and children preferred to interact with a robot rather than with a computer or a virtual character